A Direct Criterion Minimization Based fMLLR via Gradient Descend

Authors

  • Jan Vanek
  • Zbynek Zajíc
Abstract

Adaptation techniques are necessary in automatic speech recognizers to improve recognition accuracy. Linear transformation methods (MLLR or fMLLR) are the most popular when the amount of available data is limited. fMLLR is a feature-space transformation; this is an advantage in contrast to MLLR, which transforms the entire acoustic model. Classical fMLLR estimation maximizes a likelihood criterion based on the statistics of individual Gaussian components. We propose an approach that takes into account the overall likelihood of an HMM state: it estimates the transformation by optimizing the ML criterion of the HMM directly, using a gradient descent algorithm.
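The idea in the abstract, taking gradient steps on an affine feature transform (A, b) to increase the overall mixture likelihood of a state, including the log|det A| Jacobian term, can be sketched as follows. This is a minimal illustration only: a toy two-component diagonal-covariance GMM stands in for one HMM state, and the data, names, and plain gradient-ascent update are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 2, 200                             # feature dimension, number of frames

# Toy diagonal-covariance GMM standing in for one HMM state (illustrative only)
means = np.array([[0.0, 0.0], [3.0, 3.0]])
variances = np.ones((2, d))
weights = np.array([0.5, 0.5])

# Features drawn from a shifted and scaled distribution, so a transform helps
X = rng.normal(size=(T, d)) * 1.5 + 1.0

A = np.eye(d)                             # fMLLR matrix, initialized to identity
b = np.zeros(d)                           # fMLLR bias
lr = 1e-3

def criterion(A, b):
    """Overall state log-likelihood of transformed features + Jacobian term."""
    Y = X @ A.T + b
    ll = np.zeros((T, 2))
    for m in range(2):
        diff = Y - means[m]
        ll[:, m] = (np.log(weights[m])
                    - 0.5 * d * np.log(2 * np.pi)
                    - 0.5 * np.sum(np.log(variances[m]))
                    - 0.5 * np.sum(diff**2 / variances[m], axis=1))
    frame_ll = np.logaddexp(ll[:, 0], ll[:, 1])   # log-sum-exp over components
    return frame_ll.sum() + T * np.log(abs(np.linalg.det(A)))

ll_before = criterion(A, b)
for _ in range(500):
    Y = X @ A.T + b
    # Component posteriors gamma_m(t); the gradient of the overall (mixture)
    # log-likelihood is the posterior-weighted sum of component gradients.
    ll = np.zeros((T, 2))
    for m in range(2):
        diff = Y - means[m]
        ll[:, m] = np.log(weights[m]) - 0.5 * np.sum(diff**2 / variances[m], axis=1)
    gamma = np.exp(ll - ll.max(axis=1, keepdims=True))
    gamma /= gamma.sum(axis=1, keepdims=True)
    # Gradient of the criterion w.r.t. A (incl. Jacobian term) and b
    gA = T * np.linalg.inv(A).T
    gb = np.zeros(d)
    for m in range(2):
        r = gamma[:, m:m + 1] * (means[m] - Y) / variances[m]
        gA += r.T @ X
        gb += r.sum(axis=0)
    A += lr * gA / T                      # gradient ascent on the ML criterion
    b += lr * gb / T
ll_after = criterion(A, b)
```

With a small enough step size the criterion increases monotonically, which is the essential difference from the classical per-component auxiliary-function estimation: here the overall state likelihood itself is the objective.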


Related articles

Multi-Component-Multiphase Flash Calculations for Systems Containing Gas Hydrates by Direct Minimization of Gibbs Free Energy

Michelsen stability analysis and multiphase flash calculation by direct minimization of the Gibbs free energy of the system at constant temperature and pressure were used for systems containing gas hydrates. The solid hydrate phase was treated as a solid solution. The fugacities of all components of the hydrate phase were calculated as a function of composition by the rearranged model of van der Wa...

Full text

A factorized representation of FMLLR transform based on QR-decomposition

In this paper, we propose a novel representation of the FMLLR transform. This is different from the standard FMLLR in that the linear transform (LT) is expressed in a factorized form such that each of the factors involves only one parameter. The representation is mainly motivated by QR-decomposition of a square matrix and hence is referred to as QR-FMLLR. The mathematical expressions and steps ...

Full text

Sequence error (SE) minimization training of neural network for voice conversion

Neural network (NN) based voice conversion, which employs a nonlinear function to map the features from a source to a target speaker, has been shown to outperform the GMM-based voice conversion approach [4-7]. However, there are still limitations to be overcome in NN-based voice conversion, e.g. the NN is trained on a Frame Error (FE) minimization criterion and the corresponding weights are adjusted to...

Full text

Adaptive Minimum BER Reduced-Rank Interference Suppression Algorithms Based on Joint and Iterative Optimization of Parameters

In this letter, we propose a novel adaptive reduced-rank strategy based on joint iterative optimization (JIO) of filters according to the minimization of the bit error rate (BER) cost function. The proposed optimization technique adjusts the weights of a subspace projection matrix and a reduced-rank filter jointly. We develop stochastic gradient (SG) algorithms for their adaptive implementation ...

Full text

Gradient-Based Optimization of Hyperparameters

Many machine learning algorithms can be formulated as the minimization of a training criterion that involves a hyperparameter. This hyperparameter is usually chosen by trial and error with a model selection criterion. In this article we present a methodology to optimize several hyperparameters, based on the computation of the gradient of a model selection criterion with respect to the hyperpara...
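The idea in the truncated abstract above, following the gradient of a model-selection criterion with respect to a hyperparameter instead of searching by trial and error, can be illustrated with a small sketch. This is not the article's actual method: it tunes the ridge-regression regularizer by numeric gradient descent on a held-out validation loss, and all data and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data: a linear target with noise, split into training and validation sets
X_tr, X_va = rng.normal(size=(50, 5)), rng.normal(size=(50, 5))
w_true = rng.normal(size=5)
y_tr = X_tr @ w_true + rng.normal(scale=0.5, size=50)
y_va = X_va @ w_true + rng.normal(scale=0.5, size=50)

def val_loss(lam):
    # Minimizer of the training criterion in closed form (ridge regression),
    # then the model-selection criterion: mean squared validation error.
    w = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(5), X_tr.T @ y_tr)
    return np.mean((X_va @ w - y_va) ** 2)

log_lam = np.log(10.0)                 # start from a deliberately large lambda
for _ in range(200):
    eps = 1e-4                         # numeric gradient w.r.t. log(lambda)
    g = (val_loss(np.exp(log_lam + eps)) - val_loss(np.exp(log_lam - eps))) / (2 * eps)
    log_lam -= 0.5 * g                 # gradient descent on the selection criterion
final = val_loss(np.exp(log_lam))
```

Optimizing in log space keeps the hyperparameter positive; with an exact gradient (as in the article) rather than finite differences, the same scheme extends to many hyperparameters at once.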

Full text


Journal title:

Volume   Issue 

Pages  -

Publication date: 2013